Towards Accurate Markerless Human Shape and Pose Estimation over Time

Abstract

We address the problem of accurately estimating human shape, pose, and motion from images and video without markers or special cameras. Existing methods often assume known backgrounds, static cameras, and sequence-specific motion priors. Here we propose a fully automatic method that, given multi-view video, estimates 3D human motion and body shape. Our work builds on the recent SMPLify method [10], which fits a 3D body model to 2D joints detected by a CNN in a single image. We extend it in several ways. First, we fit the body to 2D features detected in multi-view images. Second, we use a CNN to automatically segment the person in each image and fit the 3D human shape and pose to the resulting contours, which further improves accuracy. Third, because the 2D pose CNN sometimes confuses the left and right sides of the body and causes large errors, we add a robust temporal prior term. Rather than using a learned, sequence-specific prior, we apply a generic DCT basis to the reconstructed 3D joints. We formulate these terms as a coherent objective function, yielding a fully automatic markerless motion-capture system we call MuVS (Multi-View SMPLify). We compare against ground-truth 3D shape and pose on the HumanEva and Human3.6M datasets; our results are significantly more accurate than the state of the art and provide a realistic 3D shape avatar.
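
As a rough illustration of two of the ingredients described above, the Python sketch below shows (a) a robust multi-view reprojection error of 3D joints against CNN-detected 2D joints and (b) a temporal prior that penalizes the component of each 3D joint trajectory lying outside the span of a low-frequency DCT basis. This is a minimal sketch, not the authors' code: the function names, the robustness scale, and the number of DCT coefficients are assumptions for illustration, and the full MuVS objective additionally includes silhouette, pose, and shape prior terms omitted here.

```python
# Hedged sketch of two MuVS-style terms; names and constants are illustrative.
import numpy as np


def dct_basis(T, K):
    """First K orthonormal DCT-II basis vectors of length T (as columns)."""
    n = np.arange(T)
    B = np.cos(np.pi / T * (n[:, None] + 0.5) * np.arange(K)[None, :])
    B[:, 0] *= 1.0 / np.sqrt(2.0)
    return B * np.sqrt(2.0 / T)            # shape (T, K)


def dct_temporal_prior(joints_3d, K=10):
    """Penalize high-frequency jitter: the part of each 3D joint trajectory
    outside the span of the first K DCT basis functions.

    joints_3d: array of shape (T, J, 3), per-frame 3D joints.
    """
    T = joints_3d.shape[0]
    B = dct_basis(T, K)                     # (T, K)
    X = joints_3d.reshape(T, -1)            # (T, 3J)
    X_smooth = B @ (B.T @ X)                # projection onto low-frequency subspace
    return np.sum((X - X_smooth) ** 2)


def multiview_joint_term(joints_3d_t, joints_2d, cams, conf, sigma=100.0):
    """Robust (Geman-McClure) reprojection error of one frame's 3D joints
    against CNN-detected 2D joints in all views.

    joints_3d_t: (J, 3); joints_2d: (V, J, 2); cams: list of V 3x4 projection
    matrices; conf: (V, J) detection confidences used as weights;
    sigma: robustness scale in pixels (assumed value).
    """
    total = 0.0
    for P, x2d, w in zip(cams, joints_2d, conf):
        Xh = np.hstack([joints_3d_t, np.ones((joints_3d_t.shape[0], 1))])
        proj = Xh @ P.T
        proj = proj[:, :2] / proj[:, 2:3]   # perspective division
        r2 = np.sum((proj - x2d) ** 2, axis=1)
        total += np.sum(w * r2 / (r2 + sigma ** 2))   # Geman-McClure rho
    return total
```

Projecting onto the leading DCT basis functions acts as a generic low-pass motion prior, which is what allows the method to avoid learned, sequence-specific priors.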


Similar resources


Towards markerless motion capture: model estimation, initialization and tracking

Title of dissertation: Towards Markerless Motion Capture: Model Estimation, Initialization and Tracking. Aravind Sundaresan, Doctor of Philosophy, 2007. Directed by Professor Ramalingam Chellappa, Department of Electrical and Computer Engineering. Motion capture is an important application in diverse areas such as bio-mechanics, computer animation, and human-computer interaction. Current motion capt...


Using model-based dynamic hand pose estimation for robotic imitation of human arm motion with Kinect data

Pose estimation is the process of identifying how a human body and/or individual limbs are configured in a given scene. Hand pose estimation is an important research topic with a variety of applications in human-computer interaction (HCI) scenarios, such as gesture recognition, animation synthesis and robot control. However, capturing hand motion is quite a challenging task due to its high ...


Virtual Visual Servoing for Real-Time Robot Pose Estimation

We propose a system for markerless pose estimation and tracking of a robot manipulator. By tracking the manipulator, we can obtain an accurate estimate of its position and orientation, which is necessary in many object grasping and manipulation tasks. Tracking the manipulator also allows for better collision avoidance. The method is based on the notion of virtual visual servoing. We also propose the use ...


Towards Symmetry Axis based Markerless Motion Capture

Natural interaction with virtual environments is one of the key issues for the usability of Virtual Reality applications. Device-free, intuitive interactions with the virtual world can be achieved by capturing the movements of the user with markerless motion capture. In this work we present a markerless motion capture approach which can be used to estimate the human body pose in real-time wit...



Journal title:

Volume   Issue

Pages  -

Publication date: 2017